Hosting Docusaurus with the domain zn2d.com
The fact: I have Docusaurus running in a Docker container. I would like to do the same thing with Apache and host my own site.
The problem: how to get both containers to talk to each other, and how to keep everything updated locally and remotely (through a Git repo?).
Hypothesis: there must be a way to get them to communicate.
After searching for a while for a way to expose Docker containers, or better yet, to get them to communicate with each other, and not finding an easy way to do it, I decided to change my approach.
Since I might be hosting different websites, the Docker approach seemed a little bit complicated...ish. So instead, I installed Apache directly on the host, and I'm now serving the built version from it.
Setting up an NVMe SSD on a Raspberry Pi 5
I decided to install Apache on my Raspberry Pi 5 because it has a PCIe connection. That opens up a lot of possibilities for extra features (if you look at what Jeff Geerling has been testing out: JG website). But for this project, I ordered an NVMe HAT so I could use an SSD instead of an SD card.
There are a couple of ways I could do this: image an SD card and then copy that setup over to the SSD, or image the SSD directly (which I did).
- Using the Raspberry Pi Imager, I used the non-desktop 64-bit version based on Bookworm.
- Once the drive was physically installed on the Pi, I booted from an SD card that has an OS on it and then changed the `BOOT_ORDER`:

```shell
# Edit the EEPROM config on the Pi
sudo rpi-eeprom-config --edit
# Change the BOOT_ORDER line to the following:
BOOT_ORDER=0xf416 # Need the 6 at the end because that tells it to boot from NVMe first
```
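The `BOOT_ORDER` value is read one hex digit at a time, from right to left. Here's a quick bash sketch that decodes a value into the order the bootloader tries things (the digit-to-name mapping follows the Raspberry Pi bootloader documentation; this only decodes the value, it doesn't touch the EEPROM):

```shell
#!/bin/bash
# Decode a BOOT_ORDER value into the sequence of boot sources the
# bootloader will try. The hex digits are read right to left.
decode_boot_order() {
    local hex="${1#0x}" out="" i ch
    for (( i=${#hex}-1; i>=0; i-- )); do
        ch="${hex:i:1}"
        case "$ch" in
            1) out+="SD " ;;
            2) out+="NETWORK " ;;
            4) out+="USB-MSD " ;;
            6) out+="NVMe " ;;
            f) out+="RESTART " ;;   # start over from the first digit
            *) out+="? " ;;
        esac
    done
    echo "${out% }"
}

decode_boot_order 0xf416   # prints: NVMe SD USB-MSD RESTART
```

So `0xf416` means: try NVMe, then the SD card, then USB, and loop if nothing worked.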
Apache install
- Following the basic instructions for setting up Apache, here's what I did:

```shell
sudo apt install apache2 -y
```

- And just like that, a basic website is created. Now, obviously, I need to create the proper files to host the website's domain, zn2d.com:
```apache
<VirtualHost *:80>
    ServerName zn2d.com
    ServerAlias www.zn2d.com
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/html/website/build
    ErrorLog ${APACHE_LOG_DIR}/error.log
    CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>
```

- The DocumentRoot folder needs to be created and its ownership transferred to www-data (note that the path has to match the DocumentRoot above):

```shell
sudo mkdir -p /var/www/html/website/build
sudo chown -R www-data:www-data /var/www/html/website/build
sudo chmod -R 770 /var/www/html/website/build
```
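One step worth making explicit: a vhost file in `sites-available` isn't served until it's enabled. Assuming the VirtualHost block was saved as `/etc/apache2/sites-available/zn2d.conf` (that filename is my assumption), a sketch:

```shell
# Assumption: the VirtualHost block above was saved as
# /etc/apache2/sites-available/zn2d.conf
sudo a2ensite zn2d.conf          # symlinks it into sites-enabled/
sudo a2dissite 000-default.conf  # optional: stop serving the stock page
sudo apachectl configtest        # sanity-check the config before reloading
sudo systemctl reload apache2
```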
But now... how do I make sure that it stays updated, since I'll be working on it from different locations and different computers?
- I finally decided to go with this approach: I'll work on my Docusaurus files locally, but push and pull everything through my GitHub repo. That way I'll know that whatever is in the repo is the latest version. Problems I can foresee: I don't want to log in on the Pi and manually pull the latest version, and I'll also need to build the Docusaurus project every time to get the static HTML files.
Docusaurus & Apache functionality, Git, and a script to make hosting work
- So, to be able to host your project, you need to build it locally:

```shell
npm run build
```

- This command needs to be run inside the folder where your project is, e.g. \my-website\docusaurus-project\.
- That will generate a \build\ folder where the HTML files you need are located. This is fine, but it won't work with Apache, since Apache expects the files here (unless you change the DocumentRoot): /var/www/html/website/build. So I decided to copy the build to the right folder.
- As mentioned previously, I need to pull the latest version of the website from Git directly into the project folder, e.g. \my-website\docusaurus-project\. Knowing this, here's a script that I wrote to make everything happen (including a log):
```shell
#!/bin/bash

# Set variables
GIT_REPO="git@github.com:USERNAME/yourrepo.git"  # Change this
PROJECT_DIR="/my-website/docusaurus-project"     # Change this
BRANCH="main"                                    # Change if needed
BUILD_DIR="$PROJECT_DIR/build"
DEPLOY_DIR="/var/www/html/website/build"         # Change this to whatever you called it
LOG_FILE="/var/log/deploy.log"

# Ensure the script runs as root
if [[ $EUID -ne 0 ]]; then
    echo "This script must be run as root!" | tee -a "$LOG_FILE"
    exit 1
fi

# Start logging
echo "=== Deployment Started: $(date) ===" | tee -a "$LOG_FILE"

# Check if the repository directory exists, clone if missing
if [ ! -d "$PROJECT_DIR" ]; then
    echo "Repository not found. Cloning..." | tee -a "$LOG_FILE"
    git clone -b "$BRANCH" "$GIT_REPO" "$PROJECT_DIR" || { echo "Git clone failed" | tee -a "$LOG_FILE"; exit 1; }
fi

# Navigate to the project directory
cd "$PROJECT_DIR" || { echo "Failed to access project directory" | tee -a "$LOG_FILE"; exit 1; }

# Pull latest changes, discarding any local edits on the Pi
echo "Pulling latest changes from branch $BRANCH..." | tee -a "$LOG_FILE"
git reset --hard
git clean -fd
git pull origin "$BRANCH" --force || { echo "Git pull failed" | tee -a "$LOG_FILE"; exit 1; }

# Install dependencies (only when node_modules is missing, e.g. right
# after a fresh clone) and build the project
echo "Installing dependencies and building project..." | tee -a "$LOG_FILE"
if [ ! -d "$PROJECT_DIR/node_modules" ]; then
    npm install || { echo "npm install failed" | tee -a "$LOG_FILE"; exit 1; }
fi
npm run build || { echo "npm build failed" | tee -a "$LOG_FILE"; exit 1; }

# Verify build directory exists
if [ ! -d "$BUILD_DIR" ]; then
    echo "Build directory not found!" | tee -a "$LOG_FILE"
    exit 1
fi

# Deploy build files to the Apache folder
echo "Deploying build to $DEPLOY_DIR..." | tee -a "$LOG_FILE"
rsync -av --delete "$BUILD_DIR/" "$DEPLOY_DIR/" || { echo "Deployment failed" | tee -a "$LOG_FILE"; exit 1; }

# Set correct permissions
chown -R www-data:www-data "$DEPLOY_DIR"
chmod -R 755 "$DEPLOY_DIR"

# Reload Apache service
echo "Reloading Apache service..." | tee -a "$LOG_FILE"
systemctl reload apache2 || { echo "Failed to reload Apache" | tee -a "$LOG_FILE"; exit 1; }

echo "=== Deployment Completed Successfully: $(date) ===" | tee -a "$LOG_FILE"
```
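For the cron job to find the script, it has to live at the path the crontab entry references. Assuming it was saved locally as `deployweb3.sh` (the name matches the cron line; the rest is up to you):

```shell
# Install the script where the cron entry expects it and make it executable
sudo install -m 755 deployweb3.sh /usr/local/bin/deployweb3.sh
# Run it once by hand first, so any SSH-key or npm problems show up
# interactively instead of silently in the cron log
sudo /usr/local/bin/deployweb3.sh
```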
Let's make a cron job to automate this script
- To make sure that the latest version is being displayed, I created a cron job to execute the script twice a day:

```shell
sudo crontab -e
```

- Add this line:

```shell
# minute hour day month weekday -> runs at 06:00 and 18:00 every day
0 6,18 * * * /usr/local/bin/deployweb3.sh >> /var/log/deploy.log 2>&1
```
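To check afterwards that the job actually fired, the deploy log the script writes is the easiest place to look (paths assume the defaults used above):

```shell
# Confirm the entry is registered in root's crontab
sudo crontab -l | grep deployweb3
# After 06:00 or 18:00, tail the log to confirm a run happened
sudo tail -n 20 /var/log/deploy.log
```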